# Lightweight RoBERTa
## Distilroberta Nli

**Author:** matekadlicsko · **License:** Apache-2.0 · **Task:** Text Classification · **Tags:** Transformers, English

A lightweight natural language inference model based on DistilRoBERTa, supporting zero-shot classification.
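As a sketch of how an NLI model supports zero-shot classification: each candidate label is rewritten as a hypothesis ("This text is about {label}."), the model scores each premise/hypothesis pair for entailment, and the entailment scores are normalized across labels. The sketch below uses made-up entailment logits in place of real model outputs, so no particular checkpoint is assumed.

```python
import math

def zero_shot_scores(entailment_logits):
    """Softmax-normalize per-label entailment logits into label probabilities."""
    m = max(entailment_logits)
    exps = [math.exp(x - m) for x in entailment_logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical entailment logits for three candidate labels, standing in for
# the NLI model's entailment output on each (premise, hypothesis) pair.
labels = ["sports", "politics", "tech"]
logits = [2.1, -0.3, 0.4]
probs = zero_shot_scores(logits)
best = labels[probs.index(max(probs))]
```

The key design point is that the label set is supplied at inference time, which is what makes the classifier "zero-shot": no label-specific training is needed.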
## Distilroberta Base Finetuned Wikitext2

**Author:** lamyae · **License:** Apache-2.0 · **Task:** Large Language Model · **Tags:** Transformers

A fine-tuned version of distilroberta-base on the wikitext2 dataset, primarily used for text generation.
## Distilroberta Base SmithsModel2

**Author:** stevems1 · **License:** Apache-2.0 · **Task:** Large Language Model · **Tags:** Transformers

A fine-tuned model based on distilroberta-base, suitable for specific NLP tasks.
## All Distilroberta V1 Finetuned DIT 10 Epochs

**Author:** veddm · **License:** Apache-2.0 · **Task:** Text Embedding · **Tags:** Transformers

A text embedding model fine-tuned from all-distilroberta-v1, trained for 10 epochs to a validation loss of 0.0044.
## Distilroberta Base 1

**Author:** uhlenbeckmew · **License:** Apache-2.0 · **Task:** Large Language Model · **Tags:** Transformers

A fine-tuned version of the distilroberta-base model, suitable for text-related tasks.
## Distilroberta Base Finetuned Wikitext2

**Author:** Rawat29 · **License:** Apache-2.0 · **Task:** Large Language Model · **Tags:** Transformers

A fine-tuned version of distilroberta-base on the wikitext2 dataset, primarily used for text generation.
## Distilroberta Base Squad V2

**Author:** squirro · **License:** Apache-2.0 · **Task:** Question Answering · **Tags:** Transformers, English

A question-answering model based on distilroberta-base, fine-tuned on the SQuAD 2.0 dataset and able to handle unanswerable questions.
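SQuAD 2.0-style models handle unanswerable questions by comparing the best answer-span score against a "no answer" (null) score and abstaining when the null score wins by more than a threshold. The function and values below are a minimal illustrative sketch of that abstention logic, not the model's actual API.

```python
def pick_answer(answer_text, span_score, null_score, threshold=0.0):
    """Return the extracted span unless the null ("no answer") score beats the
    best span score by more than the threshold, in which case abstain."""
    if null_score - span_score > threshold:
        return None  # question judged unanswerable
    return answer_text

# Dummy scores standing in for real model logits.
answered = pick_answer("in 1885", span_score=7.2, null_score=1.5)
abstained = pick_answer("in 1885", span_score=0.8, null_score=3.1)
```

Tuning the threshold trades precision on answerable questions against the rate of false "no answer" predictions.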
## Envibert

**Author:** nguyenvulebinh · **Task:** Large Language Model · **Tags:** Transformers, Other

envibert is a bilingual model based on the RoBERTa architecture, supporting Vietnamese and English and optimized for production environments.
## Distilroberta Base Squad2

**Author:** twmkn9 · **Task:** Question Answering

A lightweight, efficient question-answering model based on DistilRoBERTa-base, fine-tuned on the SQuAD v2 dataset.
## Codebertapy

**Author:** mrm8488 · **Task:** Large Language Model · **Tags:** Other

CodeBERTaPy is a RoBERTa-like model trained on the Python portion of GitHub's CodeSearchNet dataset, designed for code optimization.
## Babyberta 1

**Author:** phueb · **Task:** Large Language Model · **Tags:** Transformers, English

A lightweight RoBERTa variant trained on a 5-million-word corpus of American English child-directed speech, designed for language acquisition research.
## Distilroberta Finetuned Tweets Hate Speech

**Author:** mrm8488 · **Task:** Text Classification · **Tags:** English

A model fine-tuned on a tweet hate-speech detection dataset, designed to identify and classify hate speech on social media.
## Distilroberta Base Ner Wikiann Conll2003 4 Class

**Author:** philschmid · **License:** Apache-2.0 · **Task:** Sequence Labeling · **Tags:** Transformers

A named entity recognition model based on DistilRoBERTa-base, fine-tuned on the wikiann and conll2003 datasets and supporting four entity classes.
## Distilroberta Base Finetuned Wikitext2

**Author:** lucius · **License:** Apache-2.0 · **Task:** Large Language Model · **Tags:** Transformers

A fine-tuned version of distilroberta-base on the wikitext2 dataset, primarily used for text generation.
## Distilroberta Finetuned Age News Classification

**Author:** mrm8488 · **Task:** Text Classification · **Tags:** English

A news classification model based on distilroberta-base, fine-tuned on the AG News dataset, achieving an accuracy of 0.94 on the test set.
## Distilroberta Base Ner Conll2003

**Author:** philschmid · **License:** Apache-2.0 · **Task:** Sequence Labeling · **Tags:** Transformers

A named entity recognition model based on distilroberta-base, fine-tuned on the CoNLL2003 dataset.
## Babyberta 2

**Author:** phueb · **Task:** Large Language Model · **Tags:** Transformers, English

BabyBERTa is a lightweight version of RoBERTa, trained on child-directed input and designed for language acquisition research.
## Babyberta 3

**Author:** phueb · **License:** MIT · **Task:** Large Language Model · **Tags:** Transformers, English

BabyBERTa is a lightweight RoBERTa variant designed for language acquisition research, trained on a 5-million-word corpus of American English child-directed input.
## Distilroberta Base Testingsb Testingsb

**Author:** MistahCase · **License:** Apache-2.0 · **Task:** Large Language Model · **Tags:** Transformers

A fine-tuned version of distilroberta-base on an unspecified dataset, primarily used for text processing tasks.
## Bertatweetgr

**Author:** Konstantinos · **Task:** Large Language Model · **Tags:** Other

A lightweight RoBERTa fill-mask model trained primarily on Greek tweets.
## Distilroberta Base Finetuned Wikitext2

**Author:** Roy029 · **License:** Apache-2.0 · **Task:** Large Language Model · **Tags:** Transformers

A fine-tuned version of distilroberta-base on the wikitext2 dataset, primarily designed for text generation.
## Distilroberta Base Finetuned Wikitext2

**Author:** Rocketknight1 · **License:** Apache-2.0 · **Task:** Large Language Model · **Tags:** Transformers

A fine-tuned version of the distilroberta-base model on the wikitext2 dataset, suitable for text-related tasks.